HowTo: Configure diginsight telemetry to the remote tools
Diginsight is a very thin layer built on the .NET System.Diagnostics Activity API and the ILogger API.
In particular, standard .NET System.Diagnostics activities and ILogger telemetry are sent to remote tools by means of OpenTelemetry and/or Prometheus.
This makes it possible to send the full diginsight application flow to remote tools.
This article discusses how to configure Diginsight telemetry to remote tools such as Azure Monitor or Grafana.
It also shows how such telemetry can easily be analyzed with Azure Monitor tools such as Transaction Search and Transaction Detail, Azure Monitor Metrics, Logs and Azure Monitor Dashboards.
The code snippets below are available as working samples within the telemetry.samples repository.
The article HOWTO - Use Diginsight Samples.md explores how to use the diginsight samples to test and understand the integration of Diginsight telemetry in your own projects.
STEP 01 - Add a package reference to the Diginsight.Diagnostics.AspNetCore.OpenTelemetry package
As a first step, just add the Diginsight package reference to your project:
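For example, the reference can be added from the .NET CLI with `dotnet add package Diginsight.Diagnostics.AspNetCore.OpenTelemetry`, or directly in the project file (the wildcard version below is illustrative; pick the latest version from NuGet):

```xml
<!-- in your .csproj: reference the Diginsight OpenTelemetry integration package -->
<ItemGroup>
  <PackageReference Include="Diginsight.Diagnostics.AspNetCore.OpenTelemetry" Version="*" />
</ItemGroup>
```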
STEP 02 - Configure telemetry on the Startup sequence
The S01_02_SampleWebAPIWithOpentelemetry sample shows an example WebApi fully integrated with OpenTelemetry and AzureMonitor.
The Program.Main entry point activates telemetry by means of AddObservability() and UseDiginsightServiceProvider(), as shown below.
The startup sequence is identical to the one seen for the local Console and Log4Net providers.
The difference lies in AddObservability(), which, in addition to the Console and Log4Net providers, enables OpenTelemetry for AzureMonitor.
```csharp
public static void Main(string[] args)
{
    // this enables sending telemetry for the startup sequence
    // telemetry is recorded until ServiceProvider creation
    // after that, recorded telemetry is sent to the configured registered providers
    // (eg. AzureMonitor, Console, Log4Net)
    using var observabilityManager = new ObservabilityManager();
    ILogger logger = observabilityManager.LoggerFactory.CreateLogger(typeof(Program));
    Observability.LoggerFactory = observabilityManager.LoggerFactory;

    WebApplication app;
    using (var activity = Observability.ActivitySource.StartMethodActivity(logger, new { args }))
    {
        var builder = WebApplication.CreateBuilder(args);

        var services = builder.Services;
        var configuration = builder.Configuration;
        var environment = builder.Environment;

        // Add logging and opentelemetry providers
        services.AddObservability(configuration, environment, out IOpenTelemetryOptions openTelemetryOptions);
        // registers recorded telemetry for flush after ServiceProvider creation
        observabilityManager.AttachTo(services);

        services.TryAddSingleton<IActivityLoggingSampler, NameBasedActivityLoggingSampler>();

        services.AddControllers();
        services.AddEndpointsApiExplorer();
        services.AddSwaggerGen();

        // use diginsight service provider
        // this enables telemetry initialization at service provider creation
        builder.Host.UseDiginsightServiceProvider(true);

        app = builder.Build();

        if (app.Environment.IsDevelopment())
        {
            app.UseSwagger();
            app.UseSwaggerUI();
        }

        app.UseHttpsRedirection();
        app.UseAuthorization();
        app.MapControllers();
    }

    app.Run();
}
```
In the code above:
- AddObservability() configures logging to the application Console, to a log4net file and also to OpenTelemetry.
- UseDiginsightServiceProvider() activates diginsight during the service provider Build() process.
Please note that AddObservability() is implemented as an extension method that calls AddLogging() with:
- AddDiginsightConsole(): this method configures the Console log provider with some formatting options.
- AddDiginsightLog4Net(): this method configures a rolling file log in the user profile folder.
- services.Configure(openTelemetryConfiguration) and AddDiginsightOpenTelemetry(): these methods configure the OpenTelemetry provider with the AzureMonitor connection string.
When EnableMetrics is set to true, openTelemetryBuilder.WithMetrics is called to send predefined metrics such as the span_duration metric for configured methods.
When EnableTraces is set to true, openTelemetryBuilder.WithTracing is called to include ILogger traces in the OpenTelemetry flow sent to the remote tools.
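To make the composition concrete, here is a hypothetical sketch of how such an extension method could be structured. The options type name (OpenTelemetryOptions) and the method bodies are illustrative assumptions, not the actual Diginsight implementation; only AddDiginsightConsole, AddDiginsightLog4Net and AddDiginsightOpenTelemetry are named in the docs.

```csharp
// Hypothetical sketch of an AddObservability-style extension method;
// exact Diginsight signatures may differ.
public static IServiceCollection AddObservability(
    this IServiceCollection services,
    IConfiguration configuration,
    IHostEnvironment environment)
{
    IConfiguration openTelemetryConfiguration = configuration.GetSection("OpenTelemetry");

    services.AddLogging(loggingBuilder =>
    {
        loggingBuilder.ClearProviders();
        loggingBuilder.AddDiginsightConsole();   // Console provider with formatting options
        loggingBuilder.AddDiginsightLog4Net();   // rolling file log in the user profile folder
    });

    // bind the OpenTelemetry options (EnableTraces, EnableMetrics,
    // AzureMonitorConnectionString, ...) and register the OpenTelemetry provider
    services.Configure<OpenTelemetryOptions>(openTelemetryConfiguration);
    services.AddDiginsightOpenTelemetry();

    return services;
}
```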
The OpenTelemetry flow is regulated by OpenTelemetry options that include the AzureMonitorConnectionString as well as the EnableTraces and EnableMetrics flags:
```json
"OpenTelemetry": {
  "EnableTraces": true,
  "EnableMetrics": true,
  "AzureMonitorConnectionString": "",
  "ActivitySources": [
    "Azure.Cosmos.Operation",
    "Azure.Storage.Blobs.BlobBaseClient",
    "Microsoft.AspNetCore",
    "Diginsight.*",
    "S01_02_SampleWebAPIWithOpentelemetry"
  ],
  "Meters": [
    "S01_02_SampleWebAPIWithOpentelemetry"
  ],
  "ExcludedHttpHosts": [
    "login.microsoftonline.com",
    ".documents.azure.com",
    ".applicationinsights.azure.com",
    ".monitor.azure.com",
    ".b2clogin.com"
  ],
  "DurationMetricTags": [
    "category_name",
    "user_company",
    "plant_name",
    "plant_company",
    "max_concurrency"
  ]
},
```
In the above options, the ActivitySources section includes the activity sources whose telemetry is sent to the remote tools.
For every instrumented assembly, the span_duration metric is sent with the latencies of method executions.
We'll see that tags can be attached to the span_duration metric to distinguish durations recorded in different conditions.
The DurationMetricTags section includes the tags allowed on the span_duration metric.
STEP 03 - Add telemetry to code with StartMethodActivity() and ILogger statements
We are now ready to add instrumentation to the code and make the application flow observable.
The snippet below shows how to add telemetry to the GetWeatherForecast() method of the WeatherForecastController class:
```csharp
[HttpGet(Name = "GetWeatherForecast")]
public async Task<IEnumerable<WeatherForecast>> Get()
{
    using var activity = Observability.ActivitySource.StartMethodActivity(logger);

    var maxConcurrency = concurrencyOptionsMonitor.CurrentValue?.MaxConcurrency ?? -1;
    logger.LogDebug("maxConcurrency: {maxConcurrency}", maxConcurrency);
    activity.SetTag("max_concurrency", maxConcurrency.ToString("D"));

    var options = new ParallelOptions() { MaxDegreeOfParallelism = maxConcurrency };
    int[] ia = new int[20];
    int index = 0;
    var queue = new ConcurrentQueue<WeatherForecast>();
    await Parallel.ForEachAsync(ia, options, async (i, ct) =>
    {
        index++;
        var randomTemperature = Random.Shared.Next(-20, 55);
        logger.LogDebug("index {index}, randomTemperature: {randomTemperature}", index, randomTemperature);
        var weatherForecast = new WeatherForecast
        {
            Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
            TemperatureC = randomTemperature,
            Summary = Summaries[Random.Shared.Next(Summaries.Length)]
        };
        Thread.Sleep(100);
        queue.Enqueue(weatherForecast);
    });

    var res = queue.ToArray();
    activity?.SetOutput(res);
    return res;
}
```
In the snippet above:
- using var activity = Observability.ActivitySource.StartMethodActivity(logger); is added to provide observability of the method start and end.
- logger.LogDebug("randomTemperature: {randomTemperature}", randomTemperature); is used to log the randomTemperature value during the method execution.
- activity?.SetOutput(res); is used to add the method result to the method END event.
- activity.SetTag("max_concurrency", maxConcurrency.ToString("D")); is used to add the max_concurrency tag to the span_duration metric sent to the remote tools.
This allows comparing latencies for the same method with different max_concurrency values.
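The same pattern applies to any method you want to observe. Below is a minimal, self-contained sketch of the pattern; the class, method and tag values are illustrative, and only StartMethodActivity, SetTag and SetOutput come from the sample above:

```csharp
// Minimal instrumentation pattern: one activity per method,
// debug logs for intermediate values, SetOutput for the return value.
public class PlantService
{
    private readonly ILogger<PlantService> logger;

    public PlantService(ILogger<PlantService> logger) => this.logger = logger;

    public int GetPlantCount(string company)
    {
        // records the method START event (with its input parameters)
        // and, on dispose, the method END event
        using var activity = Observability.ActivitySource.StartMethodActivity(logger, new { company });

        var count = 42; // placeholder for the real lookup
        logger.LogDebug("count: {count}", count);

        // tags listed in DurationMetricTags can be used
        // to slice the span_duration metric
        activity?.SetTag("plant_company", company);

        // attaches the return value to the method END event
        activity?.SetOutput(count);
        return count;
    }
}
```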
STEP 04 - Run your code and look at the resulting application flow
The images below show the application flow generated by the WeatherForecastController.Get method, executed with different maxConcurrency levels:
(images: execution with maxConcurrency 1 | maxConcurrency 5)
The images below show the corresponding transactions on Azure Monitor:
(images: transactions with maxConcurrency 1 | maxConcurrency 5)
An easy query on Azure Monitor can be used to get the span_duration metric of the WeatherForecastController.Get method for different max_concurrency values:
```kusto
customMetrics
| where name == "diginsight.span_duration"
| extend maxConcurrency = coalesce(tostring(customDimensions.max_concurrency), "-1")
| extend span_name = tostring(customDimensions.span_name)
| where customDimensions["span_name"] contains "Controller"
| where customDimensions["span_name"] !contains ".ctor"
```
Rendering a column chart, it is easy to see that the max_concurrency value 1 produces the highest span_duration values:
```kusto
customMetrics
| where name == "diginsight.span_duration"
| extend maxConcurrency = coalesce(tostring(customDimensions.max_concurrency), "-1")
| extend span_name = tostring(customDimensions.span_name)
| where customDimensions["span_name"] contains "Controller"
| where customDimensions["span_name"] !contains ".ctor"
| summarize avgSpanDuration = avg(value/valueCount) by span_name, maxConcurrency
| order by avgSpanDuration asc
| render columnchart with (kind=unstacked, xcolumn=maxConcurrency, ycolumns=avgSpanDuration)
```
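As a variation on the same data, the following query (assuming the same customMetrics shape shown above) plots the average span duration over time instead of by max_concurrency, which is useful for spotting latency regressions rather than comparing configurations:

```kusto
customMetrics
| where name == "diginsight.span_duration"
| extend span_name = tostring(customDimensions.span_name)
| where span_name contains "Controller"
| summarize avgSpanDuration = avg(value / valueCount) by span_name, bin(timestamp, 5m)
| render timechart
```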